Principal component analysis for multivariate extremes
Authors
Abstract
In the probabilistic framework of multivariate regular variation, the first-order behavior of heavy-tailed random vectors above large radial thresholds is ruled by a homogeneous limit measure. For a high-dimensional vector, a reasonable assumption is that the support of this measure is concentrated on a lower-dimensional subspace, meaning that certain linear combinations of the components are much likelier to be large than others. Identifying this subspace, and thus reducing the dimension, will facilitate a refined statistical analysis. In this work we apply Principal Component Analysis (PCA) to a re-scaled version of radially thresholded observations. Within the statistical learning framework of empirical risk minimization, our main focus is to analyze the squared reconstruction error for the exceedances over large radial thresholds. We prove that the empirical risk converges to the true risk, uniformly over all projection subspaces. As a consequence, the best projection subspace obtained by minimizing the empirical risk is shown to converge in probability to the optimal one, in terms of the Hausdorff distance between their intersections with the unit sphere. In addition, if the exceedances are re-scaled to the unit ball, we obtain finite-sample uniform guarantees for the reconstruction error pertaining to the estimated subspace. Numerical experiments illustrate the capability of the proposed framework to improve estimators of extreme value parameters.
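The procedure described in the abstract — retain observations above a high radial threshold, rescale them to the unit sphere, fit PCA, and measure the squared reconstruction error — can be sketched as below. The simulated heavy-tailed data, the 95% threshold level, and all variable names are illustrative assumptions, not the authors' exact experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy heavy-tailed sample: Pareto radii times directions concentrated
# near a k-dimensional subspace of R^d (assumed model, for illustration).
n, d, k = 5000, 5, 2
basis = np.linalg.qr(rng.standard_normal((d, k)))[0]   # orthonormal subspace basis
angles = rng.standard_normal((n, k)) @ basis.T         # directions in the subspace
angles += 0.05 * rng.standard_normal((n, d))           # small off-subspace noise
radii = rng.pareto(2.0, size=n) + 1.0                  # heavy-tailed radii
X = radii[:, None] * angles / np.linalg.norm(angles, axis=1, keepdims=True)

# Keep exceedances over a high radial threshold and rescale each to the unit sphere.
r = np.linalg.norm(X, axis=1)
u = np.quantile(r, 0.95)                               # illustrative threshold choice
theta = X[r > u] / r[r > u, None]

# PCA of the rescaled exceedances: eigendecomposition of the empirical
# second-moment matrix, then projection onto the leading k eigenvectors.
M = theta.T @ theta / len(theta)
eigvals, eigvecs = np.linalg.eigh(M)                   # ascending eigenvalues
V = eigvecs[:, ::-1][:, :k]                            # top-k eigenvectors
proj = theta @ V @ V.T

# Empirical squared reconstruction error: the risk analyzed in the paper.
risk = np.mean(np.sum((theta - proj) ** 2, axis=1))
print(round(risk, 4))
```

Because the simulated directions lie close to a 2-dimensional subspace, the empirical reconstruction error of the fitted 2-dimensional projection is small.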
Similar articles
Sparse Principal Component Analysis for High Dimensional Multivariate Time Series
We study sparse principal component analysis (sparse PCA) for high dimensional multivariate vector autoregressive (VAR) time series. By treating the transition matrix as a nuisance parameter, we show that sparse PCA can be directly applied on analyzing multivariate time series as if the data are i.i.d. generated. Under a double asymptotic framework in which both the length of the sample period ...
Dynamic Principal Component Analysis in Multivariate Time-Series Segmentation
Principal Component Analysis (PCA)-based time-series analysis methods have become basic tools of every process engineer in the past few years thanks to their efficiency and solid statistical basis. However, there are two drawbacks of these methods which have to be taken into account. First, linear relationships are assumed between the process variables, and second, process dynamics are not con...
Principal Component Projection Without Principal Component Analysis
We show how to efficiently project a vector onto the top principal components of a matrix, without explicitly computing these components. Specifically, we introduce an iterative algorithm that provably computes the projection using few calls to any black-box routine for ridge regression. By avoiding explicit principal component analysis (PCA), our algorithm is the first with no runtime dependen...
Compression of Breast Cancer Images By Principal Component Analysis
The principle of dimensionality reduction with PCA is the representation of the dataset 'X' in terms of eigenvectors ei ∈ RN of its covariance matrix. The eigenvectors oriented in the direction with the maximum variance of X in RN carry the most relevant information of X. These eigenvectors are called principal components [8]. Ass...
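The eigenvector representation described in this blurb can be sketched in a few lines of NumPy; the synthetic dataset and the number of retained components are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 8)) @ rng.standard_normal((8, 8))  # dataset X

# Eigenvectors e_i of the covariance matrix of X, ordered by decreasing
# variance; these are the principal components.
Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)          # ascending order
components = eigvecs[:, np.argsort(eigvals)[::-1]]

# Compress by keeping only the leading k components, then reconstruct.
k = 3
scores = Xc @ components[:, :k]
X_hat = scores @ components[:, :k].T + X.mean(axis=0)
print(X_hat.shape)
```

Keeping the leading eigenvectors retains the directions of maximum variance, which is exactly the compression idea the excerpt refers to.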
Journal
Journal title: Electronic Journal of Statistics
سال: 2021
ISSN: 1935-7524
DOI: https://doi.org/10.1214/21-ejs1803